A new theory of radio wave propagation through ionospheric plasma density irregularities is presented. The theory is motivated by the need to quantify and understand the target-resolving capabilities and clutter characteristics of high-frequency radar systems. It is formulated in terms of path integrals of the ray tracing equations, which lead to expressions for the effects of the irregularities on the radar signal properties of skip distance, group delay, direction of arrival, and Doppler shift. These expressions are evaluated for the case of random density irregularities with a power-law wavenumber spectrum, yielding predicted power spectra for the signal properties. The spatio-temporal autocorrelation function of the signal phase is also derived.
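
The power-law wavenumber spectrum mentioned above can be illustrated with a minimal sketch. The functional form Phi(k) proportional to k to the power minus p is a standard model for ionospheric density irregularities; the amplitude C and spectral index p used here are hypothetical placeholder values, not parameters taken from this paper.

```python
import numpy as np

def power_law_spectrum(k, C=1.0, p=3.0):
    """Return the spectral density Phi(k) = C * k**(-p) for wavenumber k.

    C and p are illustrative placeholders: C sets the overall amplitude
    and p is the spectral index governing how rapidly power falls off
    with increasing wavenumber.
    """
    k = np.asarray(k, dtype=float)
    return C * k ** (-p)

# Each doubling of k reduces the spectral density by a factor of 2**p.
k = np.array([1.0, 2.0, 4.0])
print(power_law_spectrum(k))  # → [1.      0.125   0.015625]
```

The steep roll-off with wavenumber means that large-scale irregularities dominate the perturbations to skip distance, group delay, direction of arrival, and Doppler shift, which is why the predicted signal-property spectra inherit a power-law character.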